Expressive Power of Recurrent Neural Networks

Authors

  • Valentin Khrulkov
  • Alexander Novikov
Abstract

Deep neural networks are surprisingly efficient at solving practical tasks, but the theory behind this phenomenon is only starting to catch up with the practice. Numerous works show that depth is the key to this efficiency. A certain class of deep convolutional networks, namely those that correspond to the Hierarchical Tucker (HT) tensor decomposition, has been proven to have exponentially higher expressive power than shallow networks; that is, a shallow network of exponential width is required to realize the same score function that the deep architecture computes. In this paper, we prove the expressive power theorem (an exponential lower bound on the width of the equivalent shallow network) for a class of recurrent neural networks, namely those that correspond to the Tensor Train (TT) decomposition. This means that even processing an image patch by patch with an RNN can be exponentially more efficient than a (shallow) convolutional network with one hidden layer. Using theoretical results on the relation between these tensor decompositions, we compare the expressive powers of the HT- and TT-Networks. We also implement the recurrent TT-Networks and provide numerical evidence of their expressivity.
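As a rough sketch of the TT-Network/RNN correspondence described above, the snippet below evaluates a score function given in Tensor Train format as a linear recurrent update over a sequence of image patches. The function names, the toy linear feature map, and the chosen ranks are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def tt_score(patches, tt_cores, feature_map):
    """Score function of a TT-Network, evaluated as a linear RNN.

    patches     : list of d input patches (e.g. flattened image regions)
    tt_cores    : list of d TT-cores; core t has shape (r_{t-1}, n, r_t),
                  with boundary ranks r_0 = r_d = 1
    feature_map : callable mapping a patch to an n-dimensional feature vector
    """
    h = np.ones(1)                                  # r_0 = 1, so the initial state is the scalar 1
    for patch, core in zip(patches, tt_cores):
        f = feature_map(patch)                      # shape (n,)
        # Contract the core with the feature vector:
        # (r_{t-1}, n, r_t) x (n,) -> (r_{t-1}, r_t)
        transition = np.tensordot(core, f, axes=([1], [0]))
        h = h @ transition                          # multiplicative recurrent update, shape (r_t,)
    return h.item()                                 # r_d = 1, so the final state is the scalar score


# Toy usage with illustrative dimensions (not the authors' setup):
# 4 patches of size 16, feature dimension n = 8, TT-ranks (1, 3, 3, 3, 1).
rng = np.random.default_rng(0)
n, ranks = 8, [1, 3, 3, 3, 1]
cores = [rng.standard_normal((ranks[t], n, ranks[t + 1])) for t in range(4)]
patches = [rng.standard_normal(16) for _ in range(4)]
W = rng.standard_normal((16, n))                    # simple linear feature map for the demo
print(tt_score(patches, cores, lambda x: W.T @ x))
```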


Related articles

A New Recurrent Fuzzy Neural Network Controller Design for Speed and Exhaust Temperature of a Gas Turbine Power Plant

In this paper, a recurrent fuzzy neural network (RFNN) controller with a neural network identifier in a direct control model is designed to control the speed and exhaust temperature of the gas turbine in a combined-cycle power plant. Since turbine operation in a combined-cycle unit is considered, the speed and exhaust temperature of the gas turbine must be controlled simultaneously by the fuel command ...


An Intelligent Monitoring System for Electric Power Variation in a Nuclear Power Plant

This paper presents an electric power monitoring system based on Artificial Neural Networks (ANN) for nuclear power plants. Recurrent Neural Networks (RNN) and a feed-forward neural network are selected for plant modeling and anomaly detection because of their high capability for modeling dynamic behaviors. Two types of Recurrent Neural Networks (RNN) are used. The first is the Elman type of ...
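For context, a minimal Elman-type recurrent cell (the hidden state fed back as context units) can be sketched as follows; the variable names and dimensions are assumptions for illustration and do not reproduce the plant model used in that paper.

```python
import numpy as np

def elman_step(x, h, W_xh, W_hh, b_h, W_hy, b_y):
    """One step of a plain Elman recurrent cell.

    x : input at the current time step (e.g. a vector of sensor readings)
    h : previous hidden (context) state
    Returns the updated hidden state and the read-out.
    """
    h_new = np.tanh(W_xh @ x + W_hh @ h + b_h)      # context units carry the previous state forward
    y = W_hy @ h_new + b_y                          # linear output layer (e.g. predicted plant signal)
    return h_new, y

# Toy roll-out over a random sequence (dimensions are arbitrary for the demo).
rng = np.random.default_rng(1)
n_in, n_hid, n_out = 4, 8, 1
params = (rng.standard_normal((n_hid, n_in)), rng.standard_normal((n_hid, n_hid)),
          np.zeros(n_hid), rng.standard_normal((n_out, n_hid)), np.zeros(n_out))
h = np.zeros(n_hid)
for x in rng.standard_normal((10, n_in)):
    h, y = elman_step(x, h, *params)
```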


Electrical Load Manageability Factor analyses by Artificial Neural Network Training

On a typical medium-voltage feeder, load-side management means controlling electric energy consumption at the connected loads. Each load reacts to variations of the essential parameters to a different degree, and the collection of these reactions describes the feeder's behavior with respect to each parameter variation. Temperature, humidity, and energy-pricing variation, or a major event happening and the power utility announcing to th...


Recurrent Residual Convolutional Neural Network based on U-Net (R2U-Net) for Medical Image Segmentation

Deep learning (DL) based semantic segmentation methods have been providing state-of-the-art performance in the last few years. More specifically, these techniques have been successfully applied to medical image classification, segmentation, and detection tasks. One deep learning technique, U-Net, has become one of the most popular for these applications. In this paper, we propose a Recurrent Co...
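A single-channel sketch of the recurrent convolutional unit that such architectures stack inside a U-Net is given below; the kernels, the two recurrence steps, and the omission of multi-channel convolutions are simplifying assumptions rather than the R2U-Net implementation.

```python
import numpy as np
from scipy.signal import convolve2d

def recurrent_conv_unit(x, w_in, w_rec, steps=2):
    """Single-channel recurrent convolutional unit.

    The feed-forward response to the input is computed once, and the
    activation is then refined over a few recurrence steps by convolving
    the previous activation and adding it back before the ReLU.
    """
    z = convolve2d(x, w_in, mode="same")            # feed-forward response, fixed across steps
    h = np.maximum(z, 0.0)                          # initial activation
    for _ in range(steps):
        h = np.maximum(z + convolve2d(h, w_rec, mode="same"), 0.0)
    return h

# Toy usage on a random 16x16 feature map with 3x3 kernels (illustrative sizes).
rng = np.random.default_rng(2)
out = recurrent_conv_unit(rng.standard_normal((16, 16)),
                          rng.standard_normal((3, 3)) * 0.1,
                          rng.standard_normal((3, 3)) * 0.1)
# The residual variant additionally wraps such a unit in a shortcut: x + unit(x).
```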


Implementation of a programmable neuron in CNTFET technology for low-power neural networks

The circuit-level implementation of a novel neuron is discussed in this article. A low-power Activation Function (AF) circuit is introduced, which is then combined with a highly linear synapse circuit to form the neuron architecture. Designed in Carbon Nanotube Field-Effect Transistor (CNTFET) technology, the proposed structure consumes little power, which makes it suitable for the...



Journal:

Volume   Issue

Pages  -

Publication year: 2018